Supercomputers, Climate Models and 40 Years of the World Climate Research Programme
CHEYENNE, Wyoming — On the western rim of the Great Plains, a futuristic building looks out of place on the treeless range. The NCAR-Wyoming Supercomputing Center is indeed remote, two continents away from the weather forecasts it produces each day for Antarctica and millions of miles from the space weather it tracks.
And yet, it is surprisingly connected to the rapidly evolving story of Earth’s changing climate.
The supercomputer inside the concrete and steel building, dubbed Cheyenne, is one in a global network of high-capacity computers that run vast sets of calculations to simulate the workings of our planet.
Since the 1970s, when early computer models of the climate were emerging, they’ve been remarkably successful at projecting global temperature changes as greenhouse gas emissions rise. Over the years, the computers have become faster and more powerful, enabling the models to incorporate more of the influences on the climate and zoom in with finer-scale resolution to depict local climate changes.
Accurately gauging the risks posed by climate change at a local level is among the holy grails of climate modeling, and world-changing advances likely await the next generation of ultrafast supercomputers. At only three years old, Cheyenne is already set to be replaced by a successor with triple its speed or better.
Antonio Busalacchi, president of the University Corporation for Atmospheric Research, the consortium that oversees NCAR, which operates the Wyoming supercomputer in partnership with the National Science Foundation, is leading the team that is working on procuring the next Cheyenne.
Their goal for the new supercomputer is to understand our changing climate in ways that will help the world prepare for and adapt to the effects of rising global temperatures before it’s too late. That could mean knowing how much sea level will rise along each mile of coastline in 10, 20, 50 years, and how changes in precipitation patterns will affect water availability and drought in every community in the West, or the precise path and intensity of hurricanes before they strike. It’s what Busalacchi describes as “actionable science.”
“In order to deliver on that promise, we need to have better predictive tools,” he said. “We’re really at sort of a juncture in our future of being able to predict the Earth as a coupled system to help answer these pressing questions that society is asking of us.”
40 Years of the World Climate Research Programme
Busalacchi is the former science steering committee leader at the World Climate Research Programme, which helps set future climate research priorities and coordinates international research among thousands of scientists to target the most pressing climate science problems at hand.
This weekend, he will join 150 of the world’s leading climate experts at a symposium in San Francisco to celebrate the program’s 40th anniversary and to plan its research strategy for the next decade. It coincides with the centennial of the American Geophysical Union (AGU), which hosts the largest gathering of Earth, space and climate scientists. Like the supercomputer upgrade in Wyoming, the meeting is taking place at a critical juncture in the history of climate science, with a growing sense of looming crisis around the world.
The World Climate Research Programme has laid a path for innovation in climate modeling since it was founded in 1980 by the World Meteorological Organization and the International Council for Science. It formed with a long-range objective to seek “a better understanding of the climate system and the causes of climate variability and change.”
The program has since helped spawn some of the world’s most important climate research, from efforts to understand monsoons in Africa to future sea ice cover in the Arctic and adapting food systems to cope with global change. It targets information gaps that no single country is likely to fill on its own by bringing together scientists and computing power from around the world.
Computer modeling is at the heart of much of that work.
Climate models are based on the physics, chemistry and biology of the natural world and assumptions about factors that have and will affect the climate, such as levels of greenhouse gas emissions. It turns out, they have been remarkably accurate in projecting global temperature rise dating back to the 1970s, when computing power was a fraction of what scientists have to work with today.
Earlier this week, a group of scientists published a peer-reviewed paper comparing the early climate models published between 1970 and the mid-2000s with what actually happened. They found that 14 of the 17 early models’ projections about temperature change as emissions rise were almost indistinguishable from the observed record.
Today’s computer models are far more complex, requiring supercomputers to account for everything from the forces melting Antarctica’s ice to the impact of vegetation on temperature and moisture. But there are still uncertainties, such as how aerosols impact cloud formation that could affect temperature and how and when tipping points such as loss of sea ice or thawing of permafrost will trigger faster global changes.
The next generation models—running on even more powerful supercomputers—are being designed to incorporate more detail to help answer increasingly difficult questions.
From Billions of Computations to Quintillions
Busalacchi started his career as an oceanography graduate student, and in the late 1970s his research took him to NCAR’s Boulder campus to use the first non-classified supercomputer.
The supercomputer he worked on as a graduate student performed about 160 million computations a second. Considered revolutionary at the time, it could model oceans, land, vegetation or the atmosphere—but not all at the same time.
In contrast, the world’s soon-to-be fastest computer, coming online at the Oak Ridge National Laboratory in 2021, will perform 1.5 quintillion computations per second. Not only will computers of the future be capable of processing information on everything from the upper atmosphere to ocean currents and all the details in between—such as sea spray, dust, ice sheets and biogeochemical cycles—but they will be able to do so while capturing the ways humans influence the climate and how climate change influences humans.
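The scale of that leap is easy to understate. A back-of-envelope calculation, using only the two figures quoted above (160 million operations per second for the late-1970s machine, 1.5 quintillion for the planned Oak Ridge system), gives a sense of it; the variable names here are illustrative, not from the article:

```python
# Rough speedup between the two machines described in the article.
early_flops = 160e6        # late-1970s NCAR supercomputer: ~160 million computations/sec
exascale_flops = 1.5e18    # planned Oak Ridge machine: 1.5 quintillion computations/sec

speedup = exascale_flops / early_flops
print(f"Speedup factor: {speedup:.3g}")  # on the order of 9.4 billion times faster
```

In other words, a calculation that would have occupied the 1970s machine for roughly 300 years could in principle finish in about a second on the exascale system.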
Busalacchi uses the first report of the Intergovernmental Panel on Climate Change as an example of how earlier climate science offered only fuzzy portrayals of complex climate systems. He recalls how models, based on the data computers were able to process at the time, generated maps with grid squares so big they represented much of Western Europe and lacked sufficient detail to show the British Isles or the Alps.
“Then, over the succeeding decades, the resolution got smaller and smaller and smaller,” he said, which enabled scientists to distinguish the region’s mountain ranges and river valleys.
Global Race for High-Performance Computing
While many programs focused on the environment have been targeted for funding cuts under the Trump administration, that hasn’t been true for high-performance computers needed for climate research.
Busalacchi said the National Science Foundation, the largest source of NCAR’s funding, has provided around $99 million annually over the past few years to cover its research, 1,300 employees and the NCAR-Wyoming Supercomputing Center. And there’s not much concern about funding for the $30-40 million next-generation Cheyenne.
One factor driving the momentum behind high-performance computing is fierce international competition. According to the latest TOP500 list of supercomputers, U.S. national laboratories host the world’s two most powerful supercomputers, with China holding the next two spots. It’s a contest so intense that NCAR isn’t really trying to keep up in Wyoming. Cheyenne started in 20th place when it came online three years ago; last month it was 44th.
John Holdren, a Harvard University professor and science advisor to former president Barack Obama, says there’s another important reason supercomputing gets funding support: It’s not only scientists who prize it, but business and government leaders, too, who want to use it for studying genomics, energy and other complex scientific problems.
“The reason we need even better computing than we already have, is that—for the purposes of adaptation, for taking steps that reduce the damage from the changes in climate that we can no longer avoid—we need more localized information,” he said.
“We’ve only recently gotten models good enough to reliably say how what happens in South Florida is going to differ from what happens in North Florida, how what happens in Iowa is going to differ from what happens in Nebraska,” Holdren said. “So, just in the climate change domain, there’s a very strong argument for continuing to increase the computing power.”
NCAR will be looking for a replacement for Cheyenne this spring, once major vendors have new technologies in the pipeline.
“We want to make sure that when that procurement goes out, we can take advantage of the latest and greatest technology with respect to high-performance computing,” Busalacchi explained.
The fact that even the next Wyoming supercomputer won’t have a spot at the top of the TOP500 list doesn’t trouble Anke Kamrath, director of NCAR’s Computational and Information Systems Laboratory.
She calls the supercomputer contest a fight over “macho-flops” that doesn’t really capture all of the qualities that make a supercomputer valuable. Avoiding data bottlenecks is important too, as is data storage.
‘Trying to Minimize What We Don’t Know’
At the NCAR-Wyoming Supercomputing Center this fall, visitors peered through a viewing window to see the wiry guts of Cheyenne’s 1.5 petaflop predecessor, Yellowstone. Before being dismantled for sale as surplus government property, Yellowstone was a third as powerful as Cheyenne while taking up more than twice as much space.
During its short life, Cheyenne’s processors have whirred dutifully through nearly 20 million jobs. Some 850 projects are currently underway, moving data along wires hidden under the floor to storage stacks that scientists around the world are accessing via the web.
Busalacchi said the modeling, observations and high-performance computing are all essential tools that climate scientists need to address the urgent challenges ahead.
“We’re trying to minimize what we don’t know,” he said.
Correction: This story has been updated to correct the description of the 1970s-era computer to 160 million computations per second.